Kullback-Leibler Information of Consecutive Order Statistics
Authors
Abstract
Similar Resources
GRADE ESTIMATION OF KULLBACK-LEIBLER INFORMATION NUMBER
An estimator of the Kullback-Leibler information number, using its representation as a functional of the grade density, is introduced. Its strong consistency is proved under mild conditions on the grade density. The same approach is used to study the entropy measure of bivariate dependence (mutual information). Some applications to detection theory are also given.
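A brief sketch of the representation mentioned above, in generic notation rather than the paper's own: if X has distribution F with density f, and G is a second distribution with density g, the grade transform U = G(X) has density r(u) = f(G^{-1}(u)) / g(G^{-1}(u)) on (0, 1), and the Kullback-Leibler information number can be written as a functional of this grade density:

I(F; G) = \int \log\frac{f(x)}{g(x)}\, f(x)\, dx = \int_0^1 r(u) \log r(u)\, du

One natural plug-in route is to estimate r (for example, from the empirical grades) and insert it into the right-hand side; whether this matches the paper's estimator exactly is not clear from the abstract alone.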
Bootstrap Estimate of Kullback-Leibler Information for Model Selection
Estimation of the Kullback-Leibler amount of information is a crucial part of deriving a statistical model selection procedure that is based on the likelihood principle, like AIC. To discriminate between nested models, we have to estimate it up to the order of a constant, while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...
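For context, these are the standard facts behind this statement, not taken from the paper's full text: writing g for the true density and f_{\hat\theta} for a fitted model, the Kullback-Leibler information

D(g \,\|\, f_{\hat\theta}) = E_g[\log g(X)] - E_g[\log f_{\hat\theta}(X)]

has a first term common to all candidate models, so model selection only needs an estimate of the expected log-likelihood term. AIC estimates it, up to that model-free constant, as

\mathrm{AIC} = -2 \log \hat{L} + 2k,

where \hat{L} is the maximized likelihood and k the number of estimated parameters, with 2k acting as the bias correction for evaluating the log-likelihood on the same data used for fitting.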
Optimal Kullback-Leibler Aggregation via Information Bottleneck
In this paper, we present a method for reducing a regular, discrete-time Markov chain (DTMC) to another DTMC with a given, typically much smaller number of states. The cost of reduction is defined as the Kullback–Leibler divergence rate between a projection of the original process through a partition function and a DTMC on the correspondingly partitioned state space. Finding the reduced model w...
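As a rough illustration of the cost being minimized, assuming for simplicity that both processes are compared as Markov chains on the same state space (the paper's actual cost compares the projected, generally non-Markov process with a DTMC on the partitioned space): for irreducible transition matrices P and Q, with \pi the stationary distribution of P, the Kullback-Leibler divergence rate is

\lim_{n \to \infty} \frac{1}{n} D\!\left(P^{(n)} \,\middle\|\, Q^{(n)}\right) = \sum_i \pi_i \sum_j P_{ij} \log \frac{P_{ij}}{Q_{ij}},

where P^{(n)} and Q^{(n)} denote the laws of the first n steps of the respective stationary chains.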
Image Recognition Using Kullback-Leibler Information Discrimination
The problem of automatic image recognition based on the minimum information discrimination principle is formulated and solved. A comparison of color histograms in the Kullback-Leibler information metric is proposed. It is combined with a method of directed enumeration of alternatives, as opposed to a complete enumeration of competing hypotheses. Results of an experimental study of the Kullback-Leibler discri...
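A minimal sketch of the histogram comparison step, written in Python with hypothetical data and function names; the directed enumeration of alternatives described in the abstract is not reproduced here:

import numpy as np

def kl_histogram_distance(p_counts, q_counts, eps=1e-12):
    # Normalize raw bin counts to probability distributions.
    p = np.asarray(p_counts, dtype=float)
    q = np.asarray(q_counts, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    # eps keeps the logarithm finite when a bin is empty.
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Toy usage: two hypothetical 8-bin color histograms.
query_hist = [12, 30, 45, 20, 8, 3, 1, 1]
reference_hist = [10, 28, 50, 22, 6, 2, 1, 1]
print(kl_histogram_distance(query_hist, reference_hist))
# Recognition then picks the reference image whose histogram gives the smallest divergence.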
Alternative Kullback-Leibler information entropy for enantiomers.
In our series of studies on quantifying chirality, a new chirality measure is proposed in this work based on the Kullback-Leibler information entropy. The index computes the extra information that the shape function of one enantiomer carries over a normalized shape function of the racemate, while in our previous studies the shape functions of the R and S enantiomers were used considering one as...
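Read literally, the abstract suggests an index of roughly the following form (the notation and the exact normalization are assumptions, not the authors'): with \sigma_R the shape function of the R enantiomer and \sigma_{rac} the normalized shape function of the racemate,

I_{\mathrm{KL}} = \int \sigma_R(\mathbf{r}) \ln \frac{\sigma_R(\mathbf{r})}{\sigma_{rac}(\mathbf{r})} \, d\mathbf{r}.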
Journal
Journal title: Communications for Statistical Applications and Methods
Year: 2015
ISSN: 2383-4757
DOI: 10.5351/csam.2015.22.5.487